AI guidelines

The AI landscape is evolving rapidly, with capabilities constantly improving and new tools released daily. The current parameters for using Generative AI at Penn State are described below. Because the ethical considerations, regulations, and guidelines surrounding Generative AI must be regularly reviewed as the field evolves, these guidelines may change.

Guidance & FAQs

General Guidance

Penn State encourages safe exploration and use of generative AI tools to further our teaching, research, and service mission. Keep these guidelines in mind:

  • Be sure to follow applicable state and federal laws, including, but not limited to, the Family Educational Rights and Privacy Act (FERPA), the Health Insurance Portability and Accountability Act (HIPAA), and the Gramm-Leach-Bliley Act (GLBA).
  • Learn how to prompt AI to improve your chances of getting useful output.
  • Determine if you need to disclose your use of AI or if there are other restrictions on its use. Students should check with instructors, staff with their supervisors, and researchers with journal editors.
  • Be sure to verify that the AI output is accurate, reliable, and contains appropriate citations to real sources.
  • Be sure to check the AI output for bias, discrimination, and other ethical issues.
  • Don’t enter information that must remain confidential, including proprietary work and your or others’ personal information, into publicly available AI tools. The information could be used to train the underlying large language model and could be inadvertently disclosed to others.
  • For more detailed guidelines and information about the ethical use of AI, read the breakdown of Penn State’s AI and ethics values.
 
Research Guidance 

Accessibility Considerations

AI tools have varying levels of accessibility, which can disadvantage some users. When an AI tool will be used by more than one user, such as in a class or an administrative office, an accessibility review is required before use to comply with Policy AD69.

To initiate your review, send an email to accessibility@psu.edu with the name of the tool, your intended use, and the expected number of users. Below are Equally Effective Alternative Access Plan (EEAAP) templates for tools that have already been reviewed. You must submit an EEAAP for your specific group use of these tools.

EEAAP Templates

NOTE:  Accessibility Office approval does not mean you can use the tool for all types of information. You must still ensure that the tool is approved for the information that you’re putting in the system per the table below.

All requests for disability related accommodations must be individually evaluated in accordance with Federal law. Inquiries about student accommodations should be directed to Student Disability Resources. Inquiries about faculty and staff accommodation should be directed to the Office of Equal Opportunity and Access.

 

Generative AI Icons Course for University Instructors

“Can I use Generative AI (GenAI) on this assignment?” How often do you as an instructor receive this question? Or have you found yourself wishing that your students had asked it before using GenAI on an assignment?
[Image: Three icons labeled Allowed, Limited, and Not Permitted, indicating the levels of GenAI use permitted.]

As GenAI tools become increasingly prevalent, providing clear guidance to students has become crucial. The GenAI Use Icons are a framework that assists Penn State instructors in communicating their policies on GenAI use in coursework. The icons let instructors clearly articulate whether GenAI tools are or are not allowed for specific assignments, and they can be used to promote transparency, ethical GenAI use, and academic integrity among students. They are meant to foster conversation between students and faculty about the expectations for successful completion of assignments.

To learn more, check out the GenAI Icons module of the Canvas Styles Resource.

Proposed Use of AI Software to Assist with Grading Student Work

Software used to assist with grading is classified as courseware, and instructors must secure approval before using courseware. Before submitting a request for AI-assisted grading software, obtain approval (email is sufficient) from both the department head and dean, or from the appropriate academic leadership at Commonwealth Campuses.

Also check whether the courseware is already on the Reviewed Courseware List.
Note: Any use of software to assist with grading must comply with Senate Policy 47-20 Basis for Grades: Instructors must assign grades based on their professional judgment of student achievement. When using tools to assist with grading, instructors must review student work and verify grade accuracy. Without proper instructor review, students could successfully appeal grades through the Grade Mediation process (AAPPM G-10). 

AI Tools and Platforms

Although the University does not endorse any particular generative AI tool, employees are permitted to use generative AI tools of their choice without any review as long as only public, non-confidential data is involved in such use. These individual use cases do not require any sort of delegation or completion of the Software Request Form. If use is intended for a group and/or class, see the Accessibility Considerations guidance above for accessibility compliance.

AI tools and platforms are available based on the sensitivity of information being processed and institutional agreements with vendors that include additional protections. The tables describe the tools and platforms available for each level of information and how to obtain access.  

Penn State Information Technology (IT) Learning and Development's AI Guides Program

The AI Guides Program aims to help University faculty and staff members take the first steps in learning about and using Penn State-approved AI tools like Microsoft Copilot.

The program will provide introductory-level support to help increase AI skills; guidance on crafting prompts to get the most out of AI tools; one-on-one consultations; brainstorming and strategy sessions to enhance teaching, learning and workplace tasks; tips and best practices; opportunities to explore AI content generation and creating AI assistants; and techniques to validate AI-generated content for accuracy and reliability.

The soft launch of the program will run from Feb. 2 through May 29, 2026. 

Visit the Penn State IT website to request a consultation.

Digital Learning Academic Committee (DLAC) AI Tools Report for Selection and Critical Evaluation

This report, sponsored by the Digital Learning Academic Committee (DLAC), serves as a resource to inform decisions about the selection and implementation of AI tools for a range of teaching and learning contexts. It provides critical and well-organized information about common tools and their uses as well as risks and mitigation strategies, along with real-world Penn State examples. 
 
Those involved in evaluating AI tools for individual courses, as well as those crafting unit-wide guidance for tool selection, will find valuable information in this report.
 
View report.
 
Questions can be directed to Chris Millet (cxm470@psu.edu), Andy Pron (amp8076@psu.edu), or Louis Leblond (lul29@psu.edu). 

General Use AI Tools

This category of tools includes “chatbots” and AI assistants for general use and productivity. They are designed to understand and generate human-like responses to text-based, natural language prompts. They can generate text, code, and images, translate languages, write different kinds of creative content, or integrate with productivity and collaboration tools. 

Information levels refer to Penn State’s information classification types (Level 1-4).

Tool: Microsoft Copilot (formerly Bing Chat Enterprise)
  • Information allowed and prohibited: Levels 1 and 2 allowed; Levels 3 and 4 prohibited
  • Availability: Available to faculty, staff, and students (over 18 years of age)

Tool: ($) Microsoft Copilot for M365, integrated with M365 tools (Teams, Outlook, Word, Excel, PowerPoint, etc.)
  • Information allowed and prohibited: Levels 1 and 2 allowed; Levels 3 and 4 prohibited
  • Availability: Purchase through the Software@Penn State catalog

Tool: All other general AI tools (ChatGPT, Claude, Gemini, etc.)
  • Information allowed and prohibited: Public, non-confidential information allowed; information that is not public, or that is confidential, must be requested via the Software Request Form
  • Availability: Free or purchased. NOTE: Consider using already purchased tools such as Microsoft Copilot before purchasing other tools.

($) Purchase required

Platforms for Building with AI

This category of tools includes API access to enable developers to integrate Large Language Models (LLMs) into their own applications, products, or services. This includes chatbot creation and customization, building and testing applications, access to model training and deployment, coding, predictive analytics, and more. Code and low/no-code offerings are available. These tools are subject to change based on availability.
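For readers new to this kind of API access, the sketch below shows one minimal way a developer might send a prompt to a hosted model, using Amazon Bedrock (listed below) through the AWS boto3 SDK for Python. The region, model ID, and prompt are illustrative placeholders only; actual model availability depends on the cloud provider account requested via ServiceNow, and the information-level rules below still govern any data you send.

```python
import json
import boto3  # AWS SDK for Python; assumes a provisioned Bedrock-enabled account

# Placeholder region -- use the region configured for your cloud provider account.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Anthropic "Messages" request format used by Claude models on Bedrock.
# The model ID is a placeholder; availability varies by account and region.
request_body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Draft a one-paragraph summary of our public project README."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps(request_body),
)

# The response body is a streaming payload; read and parse it as JSON.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```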

Platform: ($) Azure AI
  • Description: Build generative AI models from Azure OpenAI Service, Falcon, Stable Diffusion, and Meta.
  • Information allowed and prohibited: Levels 1 and 2 allowed; Level 3 allowed with an approved Secure Enclave request; Level 4 prohibited
  • Requesting access: Faculty and staff may request a cloud provider account via ServiceNow

Platform: ($) Amazon Bedrock
  • Description: Build generative AI applications using models from AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, and Stability AI.
  • Information allowed and prohibited: Levels 1 and 2 allowed; Level 3 allowed with an approved Secure Enclave request; Level 4 prohibited
  • Requesting access: Faculty and staff may request a cloud provider account via ServiceNow

Platform: ($) Google Vertex AI
  • Description: Build generative AI applications using Google and open-source models (https://cloud.google.com/vertex-ai/generative-ai/docs/model-garden/explore-models).
  • Information allowed and prohibited: Levels 1 and 2 allowed; Level 3 allowed with an approved Secure Enclave request; Level 4 prohibited
  • Requesting access: Faculty and staff may request a cloud provider account via ServiceNow

Platform: All other platforms
  • Requesting access: Request via the Software Request Form

We welcome questions or comments at aihub@psu.edu.
